9 Jan, 2024 17:50

AI poses a threat to our planet, but not the one you might think

The energy required to run high-performance chips and cooling systems makes AI like oil – lucrative for humans, but at an environmental cost

Even as humanity eagerly embraces artificial intelligence despite misgivings on the academic and security fronts, AI’s hunger for energy and its carbon footprint are causing growing concern. AI is often compared to oil: once extracted and refined, both are lucrative commodities, and both carry an environmental cost that has surprised many.

An article in the MIT Technology Review says the life cycle for training common large AI models has a significant environmental impact, reporting that “the entire process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes the manufacture of the car itself).”

A research article by Alex de Vries of the VU Amsterdam School of Business and Economics likewise raises concerns over the electricity demands of rapidly accelerating computation, and the potential environmental impact of AI and data centers. “In recent years, data centers’ electricity consumption has accounted for a relatively stable 1% of global electricity use, excluding cryptocurrency mining,” de Vries says.

How AI data centers work

An MIT study notes that a decade ago, “most NLP (Natural Language Processing) models could be trained and developed on a commodity laptop or server.” Training modern AI models, by contrast, requires multiple instances of specialized hardware such as Graphics Processing Units (GPUs) or Tensor Processing Units (TPUs), housed in data centers.

“The goal of a large language model is to guess what comes next in a body of text,” says an article from the Columbia Climate School. “To achieve this, it first must be trained. Training involves exposing the model to huge amounts of data (possibly hundreds of billions of words) which can come from the internet, books, articles, social media, and specialized datasets.”

This training process takes weeks or months, during which the model learns to perform its tasks accurately by repeatedly adjusting its internal weights in response to the training data.

At first, the model makes essentially random guesses. With continued training, it picks up more and more of the patterns and relationships in the data, and its outputs become increasingly accurate and relevant.
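
To make that loop concrete, here is a minimal sketch in Python. It is a toy, not how production LLMs are built: it assumes PyTorch is installed, uses an invented twelve-word corpus, and learns only which word tends to follow which – but it shows the same cycle of random initial guesses gradually corrected by weight updates, at miniature scale.

import torch
import torch.nn as nn

# Invented toy corpus; real models train on hundreds of billions of words.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
index = {word: i for i, word in enumerate(vocab)}

# Training pairs: each word paired with the word that actually follows it.
inputs = torch.tensor([index[w] for w in corpus[:-1]])
targets = torch.tensor([index[w] for w in corpus[1:]])

# A single embedding table scoring every possible next word for each
# current word. Its weights start out random, so early guesses are random.
model = nn.Embedding(len(vocab), len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)  # how wrong were the guesses?
    loss.backward()                         # compute weight adjustments
    optimizer.step()                        # nudge weights toward the data

# After training, the model assigns the highest score to the word that
# followed "sat" in the corpus ("on" in every case here).
scores = model(torch.tensor([index["sat"]]))
print(vocab[scores.argmax().item()])  # prints "on"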

Advances in techniques and hardware for training neural networks in recent years have enabled “impressive accuracy improvements across many fundamental NLP tasks.”

“As a result, training a state-of-the-art model now requires substantial computational resources which demand considerable energy, along with the associated financial and environmental costs,” the MIT study adds.

AI data center energy demand and carbon footprint

The rapid expansion and large-scale application of AI in 2022 and 2023, following the launch of OpenAI’s ChatGPT, have driven the development of large language models (LLMs) by major tech companies such as Microsoft and Alphabet (Google).

The success of ChatGPT, which reached an unprecedented 100 million users in two months, prompted Microsoft and Google to launch their own AI chatbots, Bing Chat and Bard respectively, de Vries’ article recounts.

De Vries told RT: “We already know that data centers represent 1% of global electricity consumption. Thanks to digital trends such as cryptocurrency mining and AI, this can easily grow to 2% and more in the coming years.”

The MIT study estimated that cloud computing has a larger carbon footprint than the entire airline industry. Furthermore, a single data center may consume as much electricity as is needed to power around 50,000 homes.

Electricity is needed to run high-performance chips and the cooling systems that keep them from overheating as they churn through enormous amounts of data to produce accurate responses.

De Vries’ study says that Hugging Face’s “BigScience Large Open-Science Open-Access Multilingual (BLOOM) model consumed 433 MWh of electricity during training.” 

“Other LLMs, including GPT-3, Gopher and Open Pre-trained Transformer (OPT), reportedly use 1,287, 1,066, and 324 MWh respectively for training. Each of these LLMs was trained on terabytes of data and has 175 billion or more parameters,” the study adds.

De Vries quoted research firm SemiAnalysis in his paper, which suggested that OpenAI required 3,617 NVIDIA HGX A100 servers, with a total of 28,936 GPUs, to support ChatGPT, implying an energy demand of 564 MWh per day.
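
Those numbers are internally consistent, as a quick back-of-the-envelope check in Python shows. The script below only rearranges the figures quoted above; the per-GPU wattage it derives is an average that folds in server and cooling overhead, not a hardware specification.

# Figures quoted from SemiAnalysis via de Vries' paper.
servers = 3_617        # NVIDIA HGX A100 servers
gpus = 28_936          # total GPUs
daily_mwh = 564        # estimated energy demand per day

print(gpus / servers)                # 8.0 GPUs per HGX A100 server
avg_kw = daily_mwh * 1_000 / 24      # MWh per day -> average load in kW
print(round(avg_kw / servers, 1))    # ~6.5 kW average draw per server
print(round(avg_kw * 1_000 / gpus))  # ~812 W per GPU, overhead included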

“Google reported that 60% of AI-related energy consumption from 2019 to 2021 stemmed from inference (where live data is run through an AI model). Google’s parent company, Alphabet, also expressed concern regarding the costs of inference compared to the costs of training,” the paper added.

A study by researchers at the University of California, Berkeley estimated that training GPT-3, the 175-billion-parameter model that ChatGPT is built on, produced 502 metric tons of CO2, while running it emits around 50 pounds of CO2 a day (or 8.4 tons a year).
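
The daily and yearly figures in that estimate line up, as a short conversion shows (using the standard pound-to-kilogram factor and reading “tons” as metric tons):

LB_TO_KG = 0.45359237  # exact pounds-to-kilograms conversion factor
daily_lb = 50          # daily CO2 emissions quoted above
tonnes_per_year = daily_lb * LB_TO_KG * 365 / 1_000
print(round(tonnes_per_year, 1))  # ~8.3, in line with the ~8.4 tons cited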

The debate on AI viability and future actions

De Vries says that the extra energy demand from data centers will typically be met by fossil fuels. “We only have a limited supply of renewables and we (have) already prioritized those, so any extra demand will be powered by fossil fuels that we need to get rid of,” he told RT. “Even if we put renewables in AI, something else somewhere else will have to be powered with fossil fuel – which will only exacerbate climate change.”

Avik Sarkar, a professor at the Indian School of Business and former head of the data analytics center at India’s NITI Aayog, considers the debate around AI’s energy demands and carbon footprint trivial. In 2018, he worked with the International Energy Agency (IEA) on an analysis of the growth of data centers in India and their impact on the country’s energy consumption.

“The footprint of AI on energy consumption is minuscule, and many other technologies guzzle vast amounts of energy,” he told RT. “Look at any high street in the major cities: the amount of lighting in the billboards is so enormous that the lights are visible from outer space. These ‘night lights’ are a great indicator of development and economic growth. Energy consumption is a natural effect of urbanization, capitalism and economic growth – we have to learn to live with that reality.”

Commenting on AI data centers’ energy demand and the impact of their carbon emissions, de Vries says the issue is not confined to India, since climate change is a global problem. “If we push up both power demand and carbon emissions as a result of AI, this will also affect all vulnerable countries,” he said.

Sarkar does concede that AI’s massive energy consumption stems from the large data centers that provide its storage and computing infrastructure, and that the water used to cool those data centers has a further environmental impact.

Sarkar pointed out that most global data centers are located outside India, arguing that the country does not presently face a major challenge: with the exception of personal data, Indian data can be stored in centers abroad.

“Critical data related to financial transactions, Aadhaar, or healthcare data need to reside in India, and it would be enormous. India has different climate zones and can mitigate the high energy consumption by locating these data centers in cooler, non-seismic zones in the country,” he suggested.

According to de Vries, the good news is that there are bottlenecks in the AI server supply chain, which means that growth is somewhat constrained in the near term. “We should use this opportunity to think about responsible application of AI and ensure that transparency is also provided where AI is being used so that we can properly assess the impact this technology has,” he said.
